Feature Weighting for Lazy Learning Algorithms
Author
Abstract
Learning algorithms differ in the degree to which they process their inputs prior to their use in performance tasks. Many algorithms eagerly compile input samples and use only the compilations to make decisions. Others are lazy: they perform less precompilation and use the input samples to guide decision making. The performance of many lazy learners significantly degrades when samples are defined by features containing little or misleading information. Distinguishing feature relevance is a critical issue for these algorithms, and many solutions have been developed that assign weights to features. This chapter introduces a categorization framework for feature weighting approaches used in lazy similarity learners and briefly surveys some examples in each category.

1.1 INTRODUCTION

Lazy learning algorithms are machine learning algorithms (Mitchell, 1997) that are welcome members of procrastinators anonymous. Purely lazy learners typically display the following characteristics (Aha, 1997):

1. Defer: They delay the processing of their inputs until they receive requests for information; they simply store their inputs for future use.
2. Demand-Driven: They reply to information queries by combining information from their stored (e.g., training) samples.
3. Discard: They delete the constructed query and any intermediate results.

In contrast, eager algorithms greedily replace their inputs with an abstraction (e.g., a rule set, decision tree, or neural network) and use it to process queries.
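The three characteristics above can be sketched as a minimal feature-weighted k-nearest-neighbor classifier. This is an illustrative assumption of how such a lazy learner might look, not the chapter's specific algorithm; the class name, API, and weighting scheme are invented for the example:

```python
import math

class LazyWeightedKNN:
    """Minimal sketch of a lazy, feature-weighted k-NN classifier.
    Names and API are illustrative, not the chapter's algorithm."""

    def __init__(self, k=3, weights=None):
        self.k = k
        self.weights = weights  # per-feature relevance weights
        self.samples = []

    def fit(self, X, y):
        # Defer: no compilation; simply store the training samples.
        self.samples = list(zip(X, y))
        if self.weights is None:
            self.weights = [1.0] * len(X[0])
        return self

    def _distance(self, a, b):
        # Weighted Euclidean distance: a weight near 0 effectively
        # removes a misleading feature from the similarity computation.
        return math.sqrt(sum(w * (ai - bi) ** 2
                             for w, ai, bi in zip(self.weights, a, b)))

    def predict(self, query):
        # Demand-driven: combine stored samples only when queried.
        nearest = sorted(self.samples,
                         key=lambda s: self._distance(s[0], query))[:self.k]
        votes = {}
        for _, label in nearest:
            votes[label] = votes.get(label, 0) + 1
        # Discard: distances and votes are not retained after the reply.
        return max(votes, key=votes.get)
```

With a weight of 0 on an uninformative feature, the classifier ignores it entirely, which is exactly the degradation-avoidance role feature weighting plays for lazy learners.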
Similar resources
Evolutionary feature weighting to improve the performance of multi-label lazy algorithms
In the last decade, several modern applications in which each example belongs to more than one label at a time have attracted the attention of machine learning researchers. Several derivatives of the k-nearest neighbours classifier for dealing with multi-label data have been proposed. A k-nearest neighbours classifier depends heavily on the definition of a distance function, which i...
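The evolutionary weighting idea this entry describes can be sketched as a toy mutation-based search that tunes per-feature weights to maximize leave-one-out 1-NN accuracy. The function names, mutation scheme, and parameters are illustrative assumptions, not the paper's actual algorithm:

```python
import math
import random

def weighted_dist(w, a, b):
    # Weighted Euclidean distance between feature vectors a and b.
    return math.sqrt(sum(wi * (ai - bi) ** 2 for wi, ai, bi in zip(w, a, b)))

def loo_accuracy(w, X, y):
    # Leave-one-out accuracy of a 1-NN classifier under weights w:
    # each sample is classified by its nearest neighbor among the rest.
    correct = 0
    for i, (xi, yi) in enumerate(zip(X, y)):
        rest = [(x, lab) for j, (x, lab) in enumerate(zip(X, y)) if j != i]
        pred = min(rest, key=lambda s: weighted_dist(w, s[0], xi))[1]
        correct += (pred == yi)
    return correct / len(X)

def evolve_weights(X, y, gens=50, pop=10, seed=0):
    # Toy evolutionary search: mutate the best weight vector with
    # Gaussian noise and keep any candidate that improves fitness.
    rng = random.Random(seed)
    best = [1.0] * len(X[0])
    best_fit = loo_accuracy(best, X, y)
    for _ in range(gens):
        for _ in range(pop):
            cand = [max(0.0, w + rng.gauss(0, 0.3)) for w in best]
            fit = loo_accuracy(cand, X, y)
            if fit > best_fit:
                best, best_fit = cand, fit
    return best, best_fit
```

Because the fitness function is the classifier's own accuracy, this is a wrapper-style weighting method: the search can only keep weight vectors that make the nearest-neighbour decisions better.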
A Minimum Risk Metric for Nearest Neighbor Classification
… Retrieval in a prototype-based case library: A case study in diabetes therapy revision. [CH97] C. Cardie and N. Howe. Improving minority class prediction using case-specific feature weights. [CS93] Scott Cost and Steven Salzberg. A weighted nearest neighbor algorithm for learning with symbolic features. [DP97] Pedro Domingos and Michael Pazzani. On the optimality of the simple Bayesian classi...
Prediction of daily precipitation of Sardasht Station using lazy algorithms and tree models
Due to the heterogeneous distribution of precipitation, predicting its occurrence is one of the primary solutions for preventing possible disasters and the damage they cause. Given the high amount of precipitation in Sardasht County, the shift of its people toward agriculture in recent years, and the absence of classification models at the studied station, it is necessary to predict ...
A Lazy Approach for Machine Learning Algorithms
Most machine learning algorithms are eager methods in the sense that a model is generated from the complete training data set and, afterwards, this model is used to generalize to new test instances. In this work we study the performance of different machine learning algorithms when they are trained using a lazy approach. The idea is to build a classification model once the test instance is rec...
A lazy learning approach for building classification models
In this paper we propose a lazy learning strategy for building classification learning models. Instead of learning the models with the whole training data set before observing the new instance, a selection of patterns is made depending on the new query received and a classification model is learnt with those selected patterns. The selection of patterns is not homogeneous, in the sense that the ...
Journal:
Volume/Issue:
Pages: -
Publication date: 1998